Uncorrelated Group LASSO
Authors
Abstract
The ℓ2,1-norm is an effective regularizer for enforcing simple group sparsity in feature learning. To capture subtler structure among feature groups, we propose a new regularization called the exclusive group ℓ2,1-norm. It enforces sparsity at the intra-group level through an ℓ2,1-norm, while encouraging the selected features to spread across different groups through an ℓ2 norm at the inter-group level. The proposed exclusive group ℓ2,1-norm can eliminate feature correlations in the context of feature selection, provided that highly correlated features are collected in the same groups. To solve generic exclusive group ℓ2,1-norm regularized problems, we propose an efficient iterative re-weighting algorithm and provide a rigorous convergence analysis. Experimental results on real-world datasets demonstrate the effectiveness of the proposed regularization and algorithm.

Introduction

Sparse coding starts from the Lasso (R. Tibshirani 1994), which uses the ℓ1 norm for two-class feature selection, and extends to ℓ2,1-norm based multi-class feature selection. Group Lasso (Yuan and Lin 2006) incorporates feature group information into the feature learning process. The sparsity term is effective because real-world data have inherently sparse structures. Other extensions of sparse coding include the exclusive Lasso (Zhou, Jin, and Hoi 2010), fused Lasso (Tibshirani et al. 2005), and generalized Lasso (Roth 2004; Liu, Yuan, and Ye 2013). Because sparse learning results are intuitively interpretable, structural-sparsity-based methods have been widely applied to practical problems such as medical image analysis (Yang et al. 2010), cancer prediction (Gao and Church 2005), and gene-expression analysis (Ji et al. 2009). Among these methods, ℓ2,1-norm based approaches and their variants and extensions (Liu, Ji, and Ye 2012; Nie et al. 2010; Yuan and Lin 2006) are considered among the most effective for feature selection. The ℓ2,1 norm of a matrix W ∈ ℝ^{p×K} is defined as

\|W\|_{2,1} = \sum_{i=1}^{p} \|w^i\|_2 = \sum_{i=1}^{p} \sqrt{\sum_{k=1}^{K} W_{ik}^2},

where w^i denotes the i-th row of W.
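As a concrete illustration, the minimal NumPy sketch below computes the ℓ2,1 norm defined above together with one plausible reading of the exclusive group ℓ2,1 penalty (within-group ℓ2,1 norms coupled across groups by squaring and summing); the exact inter-group coupling used in the paper may differ, and the function names and group encoding are illustrative assumptions.

```python
import numpy as np

def l21_norm(W):
    """l2,1 norm of W (p x K): sum of the l2 norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def exclusive_group_l21(W, groups):
    """One plausible form of the exclusive group l2,1 penalty (assumption):
    the l2,1 norm of each group's rows, squared and summed across groups,
    i.e. an l2-style coupling at the inter-group level."""
    return sum(l21_norm(W[g, :]) ** 2 for g in groups)

# Toy usage: 6 features, 3 classes, two non-overlapping feature groups.
rng = np.random.default_rng(0)
W = rng.standard_normal((6, 3))
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
print(l21_norm(W), exclusive_group_l21(W, groups))
```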
Similar resources
Exclusive Feature Learning on Arbitrary Structures via ℓ1,2-norm
Group LASSO is widely used to enforce structural sparsity, which achieves sparsity at the inter-group level. In this paper, we propose a new formulation called “exclusive group LASSO”, which brings out sparsity at the intra-group level in the context of feature selection. The proposed exclusive group LASSO is applicable to any feature structures, regardless of their overlapping or non-overl...
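For intuition, the sketch below implements the ℓ1,2-style exclusive penalty this abstract describes (ℓ1 norm within each group, squared and summed across groups), assuming a simple list-of-index-arrays group encoding; the function name is illustrative, not the paper's API.

```python
import numpy as np

def exclusive_lasso_penalty(w, groups):
    """l1,2 (exclusive group LASSO) penalty: l1 norm within each group,
    squared and summed across groups. Groups may overlap."""
    return sum(np.sum(np.abs(w[g])) ** 2 for g in groups)

# Toy usage: spreading weight across groups is penalized less than
# concentrating it inside one group, which encourages intra-group sparsity.
groups = [np.array([0, 1]), np.array([2, 3])]
w_spread = np.array([1.0, 0.0, 0.0, 1.0])
w_packed = np.array([1.0, 1.0, 0.0, 0.0])
print(exclusive_lasso_penalty(w_spread, groups))  # 1 + 1 = 2
print(exclusive_lasso_penalty(w_packed, groups))  # 4 + 0 = 4
```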
Standardization and the Group Lasso Penalty.
We re-examine the original Group Lasso paper of Yuan and Lin (2007). The form of penalty in that paper seems to be designed for problems with uncorrelated features, but the statistical community has adopted it for general problems with correlated features. We show that for this general situation, a Group Lasso with a different choice of penalty matrix is generally more effective. We give insigh...
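To make the contrast concrete, here is a hedged sketch comparing the standard Yuan-and-Lin style penalty with one way a "different choice of penalty matrix" could look, namely penalizing each group's fitted contribution through the data matrix; the exact matrix advocated in the paper may differ, and both function names are illustrative.

```python
import numpy as np

def group_lasso_penalty(beta, groups):
    """Standard group LASSO penalty: sqrt(group size) times the l2 norm of each block."""
    return sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)

def group_lasso_penalty_fitted(beta, X, groups):
    """Illustrative 'different penalty matrix' variant (assumption): penalize each
    group's fitted contribution ||X_g beta_g||_2 / sqrt(n), so that correlation
    among the features inside a group changes how the block is weighted."""
    n = X.shape[0]
    return sum(np.linalg.norm(X[:, g] @ beta[g]) / np.sqrt(n) for g in groups)
```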
Uncorrelated Lasso
Lasso-type variable selection has increasingly expanded its machine learning applications. In this paper, uncorrelated Lasso is proposed for variable selection, where variable de-correlation is considered simultaneously with variable selection, so that the selected variables are as uncorrelated as possible. An effective iterative algorithm, with a proof of convergence, is presented to solve ...
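As a rough sketch of the kind of objective this describes, the function below combines a least-squares fit, an ℓ1 sparsity term, and a de-correlation term; the quadratic form β^T|R|β over the feature correlation matrix R is an assumption for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def uncorrelated_lasso_objective(beta, X, y, lam1, lam2):
    """Illustrative objective: least squares + l1 sparsity + a de-correlation
    term. The quadratic form beta^T |R| beta, with R the feature correlation
    matrix, is an assumption, not the paper's exact formulation."""
    R = np.abs(np.corrcoef(X, rowvar=False))   # p x p absolute correlations
    fit = 0.5 * np.sum((y - X @ beta) ** 2)    # least-squares data fit
    sparsity = lam1 * np.sum(np.abs(beta))     # l1 sparsity term
    decorrelation = lam2 * beta @ R @ beta     # discourages selecting correlated pairs
    return fit + sparsity + decorrelation
```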
Support Union Recovery in High-Dimensional Multivariate Regression
In multivariate regression, a K-dimensional response vector is regressed upon a common set of p covariates, with a matrix B∗ ∈ ℝ^{p×K} of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the ℓ1/ℓ2 norm is used for support union recovery, or recovery of the set of s rows for which B∗ is non-zero. Under high-dimensional scaling, w...
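A small sketch of the quantities involved, under the convention that rows of B correspond to covariates shared across the K responses: the ℓ1/ℓ2 block norm sums the row-wise ℓ2 norms, and the recovered support union is the set of rows with non-negligible norm; the helper names and tolerance are illustrative assumptions.

```python
import numpy as np

def l1_l2_block_norm(B):
    """l1/l2 block norm: sum over the p rows of B (p x K) of their l2 norms."""
    return np.sum(np.linalg.norm(B, axis=1))

def support_union(B, tol=1e-8):
    """Recovered support union: indices of rows of B with non-negligible l2 norm
    (each row gathers one covariate's coefficients across the K responses)."""
    return np.flatnonzero(np.linalg.norm(B, axis=1) > tol)
```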
Publication date: 2016